Deterministic Approximation Methods in Bayesian Inference
Abstract
In this seminar paper we give an introduction to the field of deterministic approximate inference. We cast the problem of approximating a posterior distribution over hidden variables as a variational minimization problem. In this framework we describe three algorithms: Variational Factorization, Variational Bounds, and Expectation Propagation. We analyze the approximations obtained by the three algorithms in terms of convergence and accuracy, which gives guidance on which method is preferable for a given problem.
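The variational framing in the abstract — picking the member of a tractable family that minimizes a divergence to the posterior — can be illustrated with a minimal sketch. The bimodal target density, the Gaussian family, and the grid search below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Unnormalized "posterior": a two-component Gaussian mixture (illustrative).
def p_unnorm(x):
    return (0.6 * np.exp(-0.5 * (x - 1.0) ** 2)
            + 0.4 * np.exp(-0.5 * ((x + 2.0) / 0.5) ** 2))

# Discretize so that KL(q || p) can be evaluated numerically.
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
p = p_unnorm(x)
p /= p.sum() * dx  # normalize on the grid

def kl_q_p(mu, sigma):
    """KL(q || p) for a Gaussian q with parameters (mu, sigma)."""
    q = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    mask = q > 1e-12  # skip points where q has negligible mass (avoid log 0)
    return float(np.sum(q[mask] * (np.log(q[mask]) - np.log(p[mask])) * dx))

# Variational "minimization": a crude grid search over (mu, sigma).
candidates = ((kl_q_p(m, s), m, s)
              for m in np.linspace(-4.0, 4.0, 81)
              for s in np.linspace(0.2, 3.0, 57))
kl, mu, sigma = min(candidates)
print(f"best Gaussian fit: mu={mu:.2f}, sigma={sigma:.2f}, KL={kl:.3f}")
```

Because the objective is KL(q || p), the fitted Gaussian is mode-seeking: it locks onto one mode of the bimodal target rather than spreading mass across both, which is exactly the kind of accuracy trade-off the paper compares across methods.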
Similar Resources
Estimation of the Parameters of the Lomax Distribution using the EM Algorithm and Lindley Approximation
Estimation of statistical distribution parameters is one of the important subjects of statistical inference. Owing to the applications of the Lomax distribution in business, economics, statistics, queueing theory, internet traffic modeling, and so on, in this paper the parameters of the Lomax distribution under type II censored samples are estimated using maximum likelihood and Bayesian methods. Wherea...
Measuring the Hardness of Stochastic Sampling on Bayesian Networks with Deterministic Causalities: the k-Test
Approximate Bayesian inference is NP-hard. Dagum and Luby defined the Local Variance Bound (LVB) to measure the approximation hardness of Bayesian inference on Bayesian networks, assuming the networks model strictly positive joint probability distributions, i.e., zero probabilities are not permitted. This paper introduces the k-test to measure the approximation hardness of inference on Bayesian ...
Piecewise Linear Approximations of Nonlinear Deterministic Conditionals in Continuous Bayesian Networks
To enable inference in continuous Bayesian networks containing nonlinear deterministic conditional distributions, Cobb and Shenoy (2005) have proposed approximating nonlinear deterministic functions by piecewise linear ones. In this paper, we describe two principles and a heuristic for finding piecewise linear approximations of nonlinear functions. We illustrate our approach for some commonly u...
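The core idea in that line of work — replacing a nonlinear deterministic function with a piecewise linear one on a set of knots — can be sketched in a few lines. The function, knot placement, and error check below are illustrative assumptions, not the principles or heuristic from the cited paper:

```python
import numpy as np

# Approximate a nonlinear function by a piecewise linear interpolant on
# evenly spaced knots; np.interp evaluates the resulting PL function.
f = lambda x: x ** 2
knots = np.linspace(-2.0, 2.0, 9)        # 8 linear pieces, spacing h = 0.5
xs = np.linspace(-2.0, 2.0, 4001)        # dense evaluation grid
pl = np.interp(xs, knots, f(knots))      # piecewise linear approximation
max_err = float(np.max(np.abs(pl - f(xs))))
print(f"max absolute error: {max_err:.4f}")
```

For chord interpolation of a twice-differentiable function, the worst-case error on a segment of width h is bounded by h² · max|f''| / 8; here that gives 0.5² · 2 / 8 = 0.0625, attained at each segment midpoint, so halving the knot spacing quarters the error.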
Stochastic Annealing for Variational Inference
We empirically evaluate a stochastic annealing strategy for Bayesian posterior optimization with variational inference. Variational inference is a deterministic approach to approximate posterior inference in Bayesian models in which a typically non-convex objective function is locally optimized over the parameters of the approximating distribution. We investigate an annealing method for optimiz...
Inference in Hybrid Bayesian Networks with Nonlinear Deterministic Conditionals
To enable inference in hybrid Bayesian networks containing nonlinear deterministic conditional distributions using mixtures of polynomials or mixtures of truncated exponentials, Cobb and Shenoy (2005) proposed approximating nonlinear deterministic functions by piecewise linear ones. In this paper, we describe a method for finding piecewise linear approximations of nonlinear functions based on t...